Directional Distributional Similarity for Lexical Expansion
Authors
Abstract
Distributional word similarity is most commonly perceived as a symmetric relation. Yet, one of its major applications is lexical expansion, which is generally asymmetric. This paper investigates the nature of directional (asymmetric) similarity measures, which aim to quantify distributional feature inclusion. We identify desired properties of such measures, specify a particular one based on averaged precision, and demonstrate the empirical benefit of directional measures for expansion.
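To make the feature-inclusion idea concrete, the sketch below scores how well the ranked features of a narrower term are covered by the feature set of a broader term, using an average-precision-style formula. The function name, the toy vectors, and the exact weighting are illustrative assumptions, not the paper's precise definition.

```python
from typing import Dict

def ap_inclusion(u_features: Dict[str, float], v_features: Dict[str, float]) -> float:
    """Average-precision-style directional inclusion score (illustrative sketch).

    Scores how well the features of term u are included in the feature set of
    term v. Features of u are ranked by weight; a feature counts as 'relevant'
    if v also has it. Not the exact formula specified in the paper.
    """
    # Rank u's features by descending weight.
    ranked = sorted(u_features, key=u_features.get, reverse=True)
    if not ranked:
        return 0.0

    included = 0
    precision_sum = 0.0
    for rank, feat in enumerate(ranked, start=1):
        if feat in v_features:                  # feature of u also describes v
            included += 1
            precision_sum += included / rank    # precision at this rank
    # Normalize by the number of u's features, as in average precision.
    return precision_sum / len(ranked)

# Toy context-feature weights (hypothetical): "dog" should be included in "animal".
dog    = {"bark": 3.0, "pet": 2.5, "tail": 2.0}
animal = {"pet": 2.0, "tail": 1.8, "zoo": 1.5, "wild": 1.2, "fur": 1.0, "bark": 0.5}
print(ap_inclusion(dog, animal))   # 1.0  - all of dog's features also describe animal
print(ap_inclusion(animal, dog))   # ~0.42 - lower, since animal has features dog lacks
```

The asymmetry of the two calls above is the point of a directional measure: expanding "animal" with "dog" is plausible, while the reverse is not.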
Similar Resources
Directional distributional similarity for lexical inference
Distributional word similarity is most commonly perceived as a symmetric relation. Yet, directional relations are abundant in lexical semantics and in many Natural Language Processing (NLP) settings that require lexical inference, making symmetric similarity measures less suitable for their identification. This paper investigates the nature of directional (asymmetric) similarity measures that a...
The Distributional Similarity Of Sub-Parses
This work explores computing distributional similarity between sub-parses, i.e., fragments of a parse tree, as an extension to general lexical distributional similarity techniques. In the same way that lexical distributional similarity is used to estimate lexical semantic similarity, we propose using distributional similarity between sub-parses to estimate the semantic similarity of phrases. Suc...
Co-occurrence Retrieval: A Flexible Framework for Lexical Distributional Similarity
Techniques that exploit knowledge of distributional similarity between words have been proposed in many areas of Natural Language Processing. For example, in language modeling, the sparse data problem can be alleviated by estimating the probabilities of unseen co-occurrences of events from the probabilities of seen co-occurrences of similar events. In other applications, distributional similari...
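The smoothing idea mentioned in this snippet can be illustrated with a small sketch: the conditional probability of an unseen word pair is estimated as a similarity-weighted average over the seen co-occurrences of distributionally similar words. The input tables `cooc` and `sim` are hypothetical, and this is not the specific framework proposed in the cited work.

```python
from collections import Counter
from typing import Dict

def similarity_smoothed_prob(w1: str, w2: str,
                             cooc: Dict[str, Counter],
                             sim: Dict[str, Dict[str, float]]) -> float:
    """Estimate P(w2 | w1) from co-occurrences of words similar to w1.

    Illustrative sketch of similarity-based smoothing: `cooc` maps a word to
    its observed co-occurrence counts, and `sim` maps a word to precomputed
    similarity scores for its distributional neighbours.
    """
    neighbours = sim.get(w1, {})
    total_sim = sum(neighbours.values())
    if total_sim == 0.0:
        return 0.0

    estimate = 0.0
    for w1_prime, s in neighbours.items():
        counts = cooc.get(w1_prime, Counter())
        n = sum(counts.values())
        if n == 0:
            continue
        estimate += (s / total_sim) * (counts[w2] / n)   # weighted P(w2 | w1')
    return estimate

# Hypothetical data: "consume" has no counts of its own, but its neighbours do.
cooc = {"eat": Counter({"bread": 4, "apple": 2}),
        "devour": Counter({"apple": 1, "meat": 3})}
sim = {"consume": {"eat": 0.6, "devour": 0.4}}
print(similarity_smoothed_prob("consume", "apple", cooc, sim))  # 0.3, despite zero direct counts
```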
Exploiting Linguistic Knowledge in Lexical and Compositional Semantic Models (PhD thesis by Tong Wang)
Exploiting Linguistic Knowledge in Lexical and Compositional Semantic Models. Tong Wang, Doctor of Philosophy, Graduate Department of Computer Science, University of Toronto, 2016. A fundamental principle in distributional semantic models is to use similarity in linguistic environment as a proxy for similarity in meaning. Known as the distributional hypothesis, the principle has been successfully app...
Text: now in 2D! A framework for lexical expansion with contextual similarity
A new metaphor of two-dimensional text for data-driven semantic modeling of natural language is proposed, which provides an entirely new angle on the representation of text: not only syntagmatic relations are annotated in the text, but also paradigmatic relations are made explicit by generating lexical expansions. We operationalize distributional similarity in a general framework for large cor...